

Search for: All records

Creators/Authors contains: "Deelman, E"

Note: When clicking on a Digital Object Identifier (DOI) number, you will be taken to an external site maintained by the publisher. Some full-text articles may not yet be available free of charge during the embargo (administrative interval).

Some links on this page may take you to non-federal websites. Their policies may differ from those of this site.

  1. The SCEC CyberShake platform implements a repeatable scientific workflow to perform 3D physics-based probabilistic seismic hazard analysis (PSHA). Earlier this year we calculated CyberShake Study 24.8 for the San Francisco Bay Area. Study 24.8 includes both low-frequency and broadband PSHA models, calculated at 315 sites. This study required building a regional velocity model from existing 3D models, with a near-surface low-velocity taper and a minimum Vs of 400 m/s. Pegasus-WMS managed the execution of Study 24.8 for 45 days on the OLCF Frontier and TACC Frontera systems. The study produced 127 million seismograms and 34 billion intensity measures, which were automatically transferred to SCEC storage. Study 24.8 used a HIP-language implementation of the AWP-ODC wave propagation code on Frontier's AMD GPU nodes to produce strain Green tensors, which were convolved with event realizations to synthesize seismograms. Seismograms were processed to derive data products such as intensity measures, site-specific hazard curves, and regional hazard maps. CyberShake combines 3D low-frequency deterministic (≤1 Hz) simulations with high-frequency calculations using stochastic modules from the Broadband Platform to produce results up to 25 Hz, with validation performed using historical events. New CyberShake data products from this study include vertical seismograms, vertical response spectra, and period-dependent significant durations. The presented results include comparisons of hazard estimates between Study 24.8, the previous CyberShake study for this region (18.8), and the NGA-West2 ground motion models (GMMs). We find that Study 24.8 shows overall lower hazard than 18.8, likely due to changes in rupture coherency, with the exception of a few regions: 24.8 shows higher hazard than both the GMMs and 18.8 at long periods in the Livermore area, due to deepening of the Livermore basin in the velocity model, as well as higher hazard east of San Pablo Bay and south of San Jose. At high frequencies, Study 24.8 hazard is lower than that of the GMMs, reflecting reduced variability in the stochastic components. We are also using CyberShake ground motion data to investigate the effects of preferred rupture directions on site-specific hazard. By default, PSHA hazard products assume that all events of a given magnitude on a given fault are equally likely; by varying these probabilities we can examine how preferred rupture directions on given faults affect CyberShake hazard estimates.
    Free, publicly-accessible full text available September 10, 2026
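The rupture-direction experiment described at the end of this abstract amounts to re-weighting the rupture variations that standard PSHA treats as equally likely. The sketch below is a minimal, hypothetical Python illustration of a CyberShake-style site-specific hazard curve: per-rupture annual rates are combined with the intensity measures of each rupture's hypocenter/slip variations, and a non-uniform weight vector stands in for a preferred rupture direction. All names, data structures, and numbers are illustrative assumptions, not the actual CyberShake data model or Study 24.8 values.

```python
# Minimal sketch of a CyberShake-style site-specific hazard curve.
# Inputs are hypothetical: annual rates per rupture and, for each rupture,
# the intensity measures (e.g., 3 s spectral acceleration) of its rupture
# variations at a single site.
import numpy as np

def hazard_curve(im_levels, rupture_rates, rupture_variation_ims,
                 variation_weights=None):
    """Annual rate of exceeding each intensity-measure level at one site.

    im_levels             : IM thresholds (e.g., in g)
    rupture_rates         : {rupture_id: annual rate of occurrence}
    rupture_variation_ims : {rupture_id: IMs, one per hypocenter/slip variation}
    variation_weights     : optional {rupture_id: weights per variation};
                            defaults to equal weights (the standard PSHA
                            assumption). Non-uniform weights emulate a
                            preferred rupture direction.
    """
    im_levels = np.asarray(im_levels, dtype=float)
    exceed_rate = np.zeros_like(im_levels)
    for rup_id, rate in rupture_rates.items():
        ims = np.asarray(rupture_variation_ims[rup_id], dtype=float)
        if variation_weights is None:
            w = np.full(ims.shape, 1.0 / ims.size)
        else:
            w = np.asarray(variation_weights[rup_id], dtype=float)
            w = w / w.sum()
        # P(IM > x | rupture) = weighted fraction of variations exceeding x
        p_exceed = (w[:, None] * (ims[:, None] > im_levels[None, :])).sum(axis=0)
        exceed_rate += rate * p_exceed
    return exceed_rate

# Toy usage: two ruptures, three hypocenter variations each (illustrative values).
levels = np.array([0.1, 0.2, 0.4, 0.8])            # SA thresholds in g
rates = {"rupA": 1e-3, "rupB": 5e-4}                # annual rates
ims = {"rupA": [0.15, 0.35, 0.6], "rupB": [0.1, 0.25, 0.5]}
equal = hazard_curve(levels, rates, ims)
# Up-weight the variation rupturing toward the site (preferred direction).
weights = {"rupA": [0.2, 0.3, 0.5], "rupB": [0.2, 0.3, 0.5]}
directed = hazard_curve(levels, rates, ims, weights)
print(equal, directed)
```

Comparing the two curves shows how shifting probability toward variations that rupture toward the site raises the exceedance rates at that site, which is the kind of sensitivity the abstract proposes to examine with CyberShake data.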
  2. IEEE Computer Society (Ed.)
    This poster presents our first steps to define a roadmap to robust science for high-throughput applications used in scientific discovery. These applications combine multiple components into increasingly complex multi-modal workflows that are often executed in concert on heterogeneous systems. The increasing complexity hinders the ability of scientists to generate robust science (i.e., ensuring performance scalability in space and time; trust in technology, people, and infrastructures; and reproducible or confirmable research). Scientists must withstand and overcome adverse conditions such as heterogeneous and unreliable architectures at all scales (including extreme scale), rigorous testing under uncertainties, unexplainable algorithms in machine learning, and black-box methods. This poster presents findings and recommendations to build a roadmap to overcome these challenges and enable robust science. The data was collected from an international community of scientists during a Virtual World Cafe in February 2021.
  3. IEEE Computer Society (Ed.)
    Scientists using the high-throughput computing (HTC) paradigm for scientific discovery rely on complex software systems and heterogeneous architectures that must deliver robust science (i.e., ensuring performance scalability in space and time; trust in technology, people, and infrastructures; and reproducible or confirmable research). Developers must overcome a variety of obstacles to pursue workflow interoperability, identify tools and libraries for robust science, port codes across different architectures, and establish trust in non-deterministic results. This poster presents recommendations to build a roadmap to overcome these challenges and enable robust science for HTC applications and workflows. The findings were collected from an international community of software developers during a Virtual World Cafe in May 2021. 
  4.
  5. Software is increasingly important to the scientific enterprise, and science-funding agencies are increasingly funding software work. Accordingly, many different participants need insight into how to understand the relationship between software, its development, its use, and its scientific impact. In this article, we draw on interviews and participant observation to describe the information needs of domain scientists, software component producers, infrastructure providers, and ecosystem stewards, including science funders. We provide a framework by which to categorize different types of measures and their relationships as they reach from funding and development, through scientific use, to scientific impact. We use this framework to organize a presentation of existing measures and techniques, and to identify areas in which techniques are either not widespread or entirely missing. We conclude with policy recommendations designed to improve insight into the scientific software ecosystem, make it more understandable, and thereby contribute to the progress of science.